4,166 research outputs found

    Computing as the 4th “R”: a general education approach to computing education

    Computing and computation are increasingly pervading our lives, careers, and societies - a change driving interest in computing education at the secondary level. But what should define a "general education" computing course at this level? That is, what would you want every person to know, assuming they never take another computing course? We identify possible outcomes for such a course through the experience of designing and implementing a general education university course utilizing best-practice pedagogies. Though we nominally taught programming, the design of the course led students to report gaining core, transferable skills and the confidence to employ them in their future. We discuss how various aspects of the course likely contributed to these gains. Finally, we encourage the community to embrace the challenge of teaching general education computing in contrast to and in conjunction with existing curricula designed primarily to interest students in the field.

    Delivering the benefits of persistence to system construction and execution

    In an orthogonally persistent programming system the longevity of data is independent of its other attributes. The advantages of persistence may be seen primarily in the areas of data modelling and protection resulting from simpler semantics and reduced complexity. These have been verified by the first implementations of persistent languages, typically consisting of a persistent store, a run-time system and a compiler that produces programs that may access and manipulate the persistent environment. This thesis demonstrates that persistence can deliver many further benefits to the programming process when applied to software construction and execution. To support the thesis, a persistent environment has been extended with all the components necessary to support program construction and execution entirely within the persistent environment. This is the first known example of a strongly-typed integrated persistent programming environment. The keystone of this work is the construction of a compiler that operates entirely within the persistent environment. During its construction, persistence has been exploited in the development of a new methodology for the construction of applications from components and in the optimisation of the widespread use of type information throughout the environment. Further enhancements to software construction and execution have been developed that can only be supported within an integrated persistent programming environment. It is shown how persistence forms the basis of a new methodology for dynamic optimisation of code and data. In addition, new interfaces to the compiler are described that offer increased functionality over traditional compilers. Extended by the ability to manipulate structured values within the persistent environment, the interfaces increase the simplicity, flexibility and efficiency of software construction and execution. Reflective and hyper-programming techniques are also supported. 
The methodologies and compilation facilities evolved together as the compiler was developed, and so the first uses of both were applied to one another. It is these applications that have been described in this thesis as examples of its validity. However, the methodologies and the compilation facilities need not be intertwined. The benefits derived from each are general, and they may be used in many areas of the persistent environment.
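The core idea of orthogonal persistence, that a value's longevity is independent of its other attributes such as its type or structure, can be sketched in miniature with Python's standard-library shelve module. This is only an illustrative analogy, not the strongly-typed integrated environment the thesis describes; all names below are hypothetical.

```python
import os
import shelve
import tempfile

# Illustrative sketch of orthogonal persistence: how long a value lives is
# decided by where it is bound (the persistent store), not by what it is.
# shelve is untyped and far weaker than the thesis's environment; it only
# demonstrates the "same code manipulates persistent data" idea.
path = os.path.join(tempfile.mkdtemp(), "store")

with shelve.open(path) as store:
    # Simple and structured values persist through the same interface.
    store["counter"] = 41
    store["config"] = {"opt_level": 2, "reflective": True}

# A later "program run" reopens the store and manipulates the same data.
with shelve.open(path) as store:
    store["counter"] += 1      # read-modify-write against the persistent value
    result = store["counter"]

print(result)  # 42
```

The point of the sketch is that the second `with` block could live in an entirely different program, yet it operates on the data with ordinary in-memory syntax, which is the convenience the thesis extends to compilation and program construction themselves.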

    Managing plagiarism in programming assignments with blended assessment and randomisation.

    Plagiarism is a common concern for coursework in many situations, particularly where electronic solutions can be provided, e.g. computer programs, and leads to unreliability of assessment. Written exams are often used to try to deal with this, and to increase reliability, but at the expense of validity. One solution, outlined in this paper, is to randomise the work that is set for students so that it is very unlikely that any two students will be working on exactly the same problem set. This also helps to address the issue of students trying to outsource their work by paying external people to complete their assignments for them. We examine the effectiveness of this approach and others (including blended assessment) by analysing the spread of similarity scores across four different introductory programming assignments to find the natural similarity, i.e. the level of similarity that could reasonably occur without plagiarism. The results of the study indicate that divergent assessment (having more than one possible solution) as opposed to convergent assessment (only one solution) is the dominant factor in natural similarity. A key area for further work is to apply the analysis to a larger sample of programming assignments to better understand the impact of different features of the assignment design on natural similarity and hence the detection of plagiarism.
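The two mechanisms in this abstract, per-student randomisation of the problem set and a baseline "natural similarity" estimated from independent honest solutions, can be sketched as follows. The parameter names and the use of difflib are illustrative assumptions, not taken from the paper.

```python
import difflib
import random

# Hypothetical sketch: (1) derive a per-student assignment variant from a
# seeded RNG, so each student gets different work that is reproducible for
# marking; (2) estimate natural similarity as the mean pairwise similarity
# of independently written solutions. Parameters are invented for illustration.

def make_assignment(student_id: str) -> dict:
    """Reproducibly derive a problem-set variant from the student ID."""
    rng = random.Random(student_id)   # seeding makes the variant re-derivable
    return {
        "array_size": rng.randint(8, 20),
        "operation": rng.choice(["sum", "max", "reverse"]),
        "sentinel": rng.randint(100, 999),
    }

def natural_similarity(solutions: list[str]) -> float:
    """Mean pairwise similarity ratio (0..1) across independent solutions."""
    scores = [
        difflib.SequenceMatcher(None, a, b).ratio()
        for i, a in enumerate(solutions)
        for b in solutions[i + 1:]
    ]
    return sum(scores) / len(scores)

# The same ID always regenerates the same variant (needed for marking),
# while different IDs draw from roughly 35,000 possible combinations.
print(make_assignment("student-001") == make_assignment("student-001"))  # True

# Honest, independent solutions to the same convergent task still overlap;
# this overlap is the baseline against which plagiarism must be judged.
baseline = natural_similarity([
    "total = 0\nfor v in data: total = total + v",
    "total = sum(data)",
    "total = 0\nfor x in data: total += x",
])
print(0.0 <= baseline <= 1.0)  # True
```

The design choice of seeding on the student ID, rather than storing each generated variant, means the marker can regenerate any student's exact problem set on demand.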

    Program Comprehension: Identifying Learning Trajectories for Novice Programmers

    This working group asserts that Program Comprehension (PC) plays a critical part in the writing process. For example, this abstract is written from a basic draft that we have edited and revised until it clearly presents our idea. Similarly, a program is written in an incremental manner, with each step being tested, debugged and extended until the program achieves its goal. Novice programmers should develop their program comprehension as they learn to code, so that they are able to read and reason about code while they are writing it. To foster such competencies our group has identified two main goals: (1) to collect and define learning activities that explicitly cover key components of program comprehension and (2) to define possible learning trajectories that will guide teachers using those learning activities in their CS0/CS1 or K-12 courses. [...]

    Search for Top Squark Pair Production in the Dielectron Channel

    This report describes the first search for top squark pair production in the channel stop_1 stopbar_1 -> b bbar chargino_1 chargino_1 -> ee+jets+MEt using 74.9 +- 8.9 pb^-1 of data collected using the D0 detector. A 95% confidence level upper limit on sigma*B is presented. The limit is above the theoretical expectation for sigma*B for this process, but does show the sensitivity of the current D0 data set to a particular topology for new physics.
    Comment: Five pages, including three figures; submitted to PRD Brief Report.

    Search for a Fourth Generation Charge -1/3 Quark via Flavor Changing Neutral Current Decay

    We report on a search for pair production of a fourth generation charge -1/3 quark (b') in pbar p collisions at sqrt(s) = 1.8 TeV at the Fermilab Tevatron using an integrated luminosity of 93 pb^-1. Both quarks are assumed to decay via flavor changing neutral currents (FCNC). The search uses the signatures gamma + 3 jets + mu-tag and 2 gamma + 2 jets. We see no significant excess of events over the expected background. We place an upper limit on the production cross section times branching fraction that is well below theoretical expectations for a b' quark decaying exclusively via FCNC for b' quark masses up to m(Z) + m(b).
    Comment: Eleven pages, two PostScript figures; submitted to Physical Review Letters.

    Measurement of the Top Quark Pair Production Cross Section in pbarp Collisions

    We present a measurement of the ttbar production cross section in pbar p collisions at sqrt(s) = 1.8 TeV by the D0 experiment at the Fermilab Tevatron. The measurement is based on data from an integrated luminosity of approximately 125 pb^-1 accumulated during the 1992-1996 collider run. We observe 39 ttbar candidate events in the dilepton and lepton+jets decay channels with an expected background of 13.7 +- 2.2 events. For a top quark mass of 173.3 GeV/c^2, we measure the ttbar production cross section to be 5.5 +- 1.8 pb.
    Comment: 11 pages with 3 encapsulated PostScript figures and 2 PostScript tables included in the body of the article.
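The quoted numbers are tied together by the standard counting-experiment relation sigma = (N_obs - N_bkg) / (eff * L). The combined efficiency times acceptance is not quoted in the abstract, so the arithmetic below merely backs out the value implied by the published numbers, as an illustrative consistency check.

```python
# Counting-experiment cross section: sigma = (N_obs - N_bkg) / (eff * lumi).
# All inputs are from the abstract; `eff` is NOT stated there -- it is the
# efficiency x acceptance implied by the quoted result, shown for illustration.
n_obs = 39       # observed ttbar candidate events
n_bkg = 13.7     # expected background events
lumi = 125.0     # integrated luminosity, pb^-1
sigma = 5.5      # measured cross section, pb

eff = (n_obs - n_bkg) / (sigma * lumi)   # implied efficiency x acceptance
print(round(eff, 3))  # 0.037
```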

    Search for R-parity Violating Supersymmetry in Dimuon and Four-Jets Channel

    We present results of a search for R-parity-violating decay of the neutralino chi_1^0, taken to be the Lightest Supersymmetric Particle. It is assumed that this decay proceeds through one of the lepton-number-violating couplings lambda-prime_2jk (j=1,2; k=1,2,3). This search is based on 77.5 pb^-1 of data collected by the D0 experiment at the Fermilab Tevatron in pbar p collisions at a center-of-mass energy of 1.8 TeV in 1992-1995.
    Comment: 10 pages, 3 figures.

    Measurement of the W Boson Mass

    A measurement of the mass of the W boson is presented based on a sample of 5982 W -> e nu decays observed in pbar p collisions at sqrt(s) = 1.8 TeV with the D0 detector during the 1992-1993 run. From a fit to the transverse mass spectrum, combined with measurements of the Z boson mass, the W boson mass is measured to be M_W = 80.350 +- 0.140 (stat.) +- 0.165 (syst.) +- 0.160 (scale) GeV/c^2.
    Comment: 12 pages, LaTeX, RevTeX style, including 3 PostScript figures; submitted to PRL.
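Independent uncertainties such as the three quoted here (statistical, systematic, and energy-scale) are conventionally combined in quadrature to give a single total error. The abstract does not state this combination explicitly; the arithmetic below is only a standard-convention check.

```python
import math

# Quadrature combination of independent uncertainties on M_W (GeV/c^2),
# values taken from the abstract. The combination rule is a standard
# convention assumed here, not something spelled out in the text.
stat, syst, scale = 0.140, 0.165, 0.160
total = math.sqrt(stat**2 + syst**2 + scale**2)
print(round(total, 3))  # 0.269
```

So the quoted result corresponds to a total uncertainty of about 0.27 GeV/c^2 on the W boson mass.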

    Measurement of Dijet Angular Distributions and Search for Quark Compositeness

    We have measured the dijet angular distribution in sqrt(s) = 1.8 TeV pbar p collisions using the D0 detector. Order alpha_s^3 QCD predictions are in good agreement with the data. At 95% confidence the data exclude models of quark compositeness in which the contact interaction scale is below 2 TeV.
    Comment: 11 pages, LaTeX, 3 PostScript figures.